Nonmonotonic Generalization Bias of Gaussian Mixture Models
Abstract
Theories of learning and generalization hold that the generalization bias, defined as the difference between the training error and the generalization error, increases on average with the number of adaptive parameters. This article, however, shows that this general tendency is violated for a gaussian mixture model. For temperatures just below the first symmetry breaking point, the effective num...
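The definition of generalization bias above can be illustrated numerically. The following is a minimal sketch, assuming hypothetical 1-D data and a plain EM fit; it only demonstrates how the bias is measured (held-out error minus training error, in one sign convention) and does not reproduce the article's temperature-based analysis.

```python
import numpy as np

def gaussian_pdf(x, mu, var):
    """Density of a univariate Gaussian N(mu, var) at x."""
    return np.exp(-0.5 * (x - mu) ** 2 / var) / np.sqrt(2 * np.pi * var)

def fit_gmm_1d(x, k, n_iter=200, seed=0):
    """Fit a k-component 1-D Gaussian mixture with plain EM.

    Returns (weights, means, variances). Illustrative only.
    """
    rng = np.random.default_rng(seed)
    w = np.full(k, 1.0 / k)
    mu = rng.choice(x, size=k, replace=False)
    var = np.full(k, x.var())
    for _ in range(n_iter):
        # E-step: responsibilities r[n, j] of component j for point n.
        r = w * gaussian_pdf(x[:, None], mu, var)
        r /= r.sum(axis=1, keepdims=True)
        # M-step: re-estimate weights, means, variances.
        nk = r.sum(axis=0)
        w = nk / len(x)
        mu = (r * x[:, None]).sum(axis=0) / nk
        var = (r * (x[:, None] - mu) ** 2).sum(axis=0) / nk + 1e-6
    return w, mu, var

def avg_nll(x, params):
    """Average negative log-likelihood (the 'error' in this sketch)."""
    w, mu, var = params
    p = (w * gaussian_pdf(x[:, None], mu, var)).sum(axis=1)
    return -np.mean(np.log(p))

# Hypothetical data drawn from a two-component mixture.
rng = np.random.default_rng(1)
train = np.concatenate([rng.normal(-2, 1, 100), rng.normal(2, 1, 100)])
test = np.concatenate([rng.normal(-2, 1, 1000), rng.normal(2, 1, 1000)])

params = fit_gmm_1d(train, k=2)
# Generalization bias as the gap between held-out and training error.
bias = avg_nll(test, params) - avg_nll(train, params)
```

Repeating this for increasing `k` gives the parameter-count dependence that the article shows is nonmonotonic for Gaussian mixtures.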
Similar Resources

Fuzzy Gaussian Mixture Models
In this paper, to improve both the performance and the efficiency of conventional Gaussian Mixture Models (GMMs), generalized GMMs are first introduced by integrating conventional GMMs with active-curve-axis GMMs for fitting nonlinear datasets; two types of Fuzzy Gaussian Mixture Models (FGMMs) with a faster convergence process are then proposed based on the generaliz...
Parsimonious Gaussian mixture models
Parsimonious Gaussian mixture models are developed using a latent Gaussian model which is closely related to the factor analysis model. These models provide a unified modeling framework which includes the mixtures of probabilistic principal component analyzers and mixtures of factor analyzers models as special cases. In particular, a class of eight parsimonious Gaussian mixture models which ...
Gaussian Mixture Models
Definition A Gaussian Mixture Model (GMM) is a parametric probability density function represented as a weighted sum of Gaussian component densities. GMMs are commonly used as a parametric model of the probability distribution of continuous measurements or features in a biometric system, such as vocal-tract related spectral features in a speaker recognition system. GMM parameters are estimated ...
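The definition above translates directly into code. A minimal 1-D sketch of the weighted sum of Gaussian component densities, with illustrative (hypothetical) weights, means, and variances:

```python
import numpy as np

def gaussian_pdf(x, mu, var):
    """Density of a univariate Gaussian N(mu, var) at x."""
    return np.exp(-0.5 * (x - mu) ** 2 / var) / np.sqrt(2 * np.pi * var)

def gmm_pdf(x, weights, means, variances):
    """GMM density: p(x) = sum_k w_k * N(x | mu_k, var_k)."""
    return sum(w * gaussian_pdf(x, m, v)
               for w, m, v in zip(weights, means, variances))

# Illustrative two-component mixture (weights sum to 1).
weights = [0.3, 0.7]
means = [-1.0, 2.0]
variances = [0.5, 1.0]

x = np.linspace(-5.0, 8.0, 2000)
density = gmm_pdf(x, weights, means, variances)
```

Because the mixture weights are nonnegative and sum to one, the resulting function is itself a valid probability density.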
Deep Gaussian Mixture Models
Deep learning is a hierarchical inference method in which multiple successive layers of learning describe complex relationships more efficiently. In this work, Deep Gaussian Mixture Models are introduced and discussed. A Deep Gaussian Mixture Model (DGMM) is a network of multiple layers of latent variables, where, at each layer, the variables follow a mixture of Gaussian distributions....
Journal
Journal Title: Neural Computation
Year: 2000
ISSN: 0899-7667,1530-888X
DOI: 10.1162/089976600300015439